4,295 research outputs found

    Shortlisting the influential members of criminal organizations and identifying their important communication channels

    Low-level criminals, who do the legwork in a criminal organization, are the most likely to be arrested, whereas the high-level ones tend to avoid attention. But crippling the work of a criminal organization is not possible unless investigators can identify its most influential, high-level members and monitor their communication channels. Investigators often approach this task by requesting the mobile phone service records of arrested low-level criminals to identify their contacts, and then building a network model of the organization in which each node denotes a criminal and each edge represents a communication. Network analysis can then be used to infer the most influential criminals and the most important communication channels, but screening all the nodes and links in a network is laborious and time-consuming. Here we propose a new forensic analysis system called IICCC (Identifying Influential Criminals and their Communication Channels) that can effectively and efficiently infer the high-level criminals and shortlist the important communication channels in a criminal organization, based on the mobile phone communications of its members. IICCC can also be used to build a network from crime incident reports. We evaluated IICCC experimentally and compared it with five other systems, confirming its superior prediction performance.
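    The centrality-based screening the abstract describes can be illustrated with a toy example. The following is a sketch only, not the IICCC system: it builds an undirected graph from hypothetical call records (all names invented) and scores members and channels by betweenness centrality, a standard proxy for influence and for bridging communication links.

```python
# Illustrative sketch (not IICCC): rank members and channels of a toy
# communication network by betweenness centrality, stdlib only.
from collections import defaultdict
from itertools import combinations

calls = [  # hypothetical (caller, callee) pairs from phone records
    ("a", "b"), ("a", "c"), ("b", "c"),   # one cluster
    ("c", "d"),                           # bridge between the clusters
    ("d", "e"), ("d", "f"), ("e", "f"),   # second cluster
]

adj = defaultdict(set)
for u, v in calls:
    adj[u].add(v)
    adj[v].add(u)

def shortest_paths(src, dst):
    """Return every shortest path from src to dst (layered BFS)."""
    paths, frontier, seen = [], [[src]], {src}
    while frontier and not paths:
        next_frontier, layer = [], set()
        for path in frontier:
            for nxt in adj[path[-1]]:
                if nxt in seen:
                    continue
                if nxt == dst:
                    paths.append(path + [nxt])
                else:
                    next_frontier.append(path + [nxt])
                    layer.add(nxt)
        seen |= layer
        frontier = next_frontier
    return paths

node_score = {n: 0.0 for n in adj}          # member influence
edge_score = defaultdict(float)             # channel importance
for s, t in combinations(list(adj), 2):
    sps = shortest_paths(s, t)
    for p in sps:
        w = 1 / len(sps)                    # split credit among tied paths
        for n in p[1:-1]:                   # interior nodes carry the traffic
            node_score[n] += w
        for e in zip(p, p[1:]):             # edges are the channels
            edge_score[frozenset(e)] += w
```

    In this toy network, the two members bridging the clusters and the link between them come out on top — the kind of shortlist the abstract describes; the paper's contribution is performing such inference effectively at scale rather than by manual screening.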

    Censor-based cooperative Multi-Antenna Spectrum Sensing with Imperfect Reporting Channels

    The present contribution proposes a spectrally efficient censor-based cooperative spectrum sensing (C-CSS) approach in a sustainable cognitive radio network that consists of multiple antenna nodes and experiences imperfect sensing and reporting channels. In this context, exact analytic expressions are first derived for the corresponding probability of detection, probability of false alarm and secondary throughput, assuming that each secondary user (SU) sends its detection outcome to a fusion center only when it has detected a primary signal. Capitalizing on the findings of the analysis, the effects of critical measures, such as the detection threshold, the number of SUs and the number of employed antennas, on the overall system performance are also quantified. In addition, the optimal detection threshold for each antenna based on the Neyman-Pearson criterion is derived and useful insights are developed on how to maximize the system throughput with a reduced number of SUs. It is shown that the C-CSS approach provides two distinct benefits compared with the conventional sensing approach, i.e., without censoring: i) the sensing tail problem, which exists in imperfect sensing environments, can be mitigated; ii) fewer SUs are ultimately required to obtain higher secondary throughput, rendering the system more sustainable.
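    The censoring idea can be illustrated with a small Monte Carlo sketch (not the paper's closed-form analysis; all parameter values are invented): each SU runs a local energy detector, transmits to the fusion center only on a detection, the report may be corrupted with some probability, and the fusion center applies an OR rule.

```python
# Monte Carlo sketch of censor-based cooperative sensing (illustrative
# parameters, not the paper's analytic model).
import random

random.seed(1)

def local_detect(signal_present, snr=1.0, samples=32, threshold=1.6):
    """Energy detector: average power of noisy samples vs. a threshold."""
    energy = 0.0
    for _ in range(samples):
        x = random.gauss(0, 1)                 # unit-variance noise
        if signal_present:
            x += random.gauss(0, snr ** 0.5)   # primary-signal component
        energy += x * x
    return energy / samples > threshold

def fusion(signal_present, n_sus=4, report_error=0.05):
    """OR-rule fusion over censored reports on an imperfect channel."""
    for _ in range(n_sus):
        if local_detect(signal_present):
            # a censored SU transmits only on detection;
            # the report is lost/corrupted with probability report_error
            if random.random() > report_error:
                return True
    return False

trials = 2000
p_d = sum(fusion(True) for _ in range(trials)) / trials    # detection prob.
p_fa = sum(fusion(False) for _ in range(trials)) / trials  # false-alarm prob.
```

    Because silence is the default, censoring saves reporting bandwidth while the OR rule keeps the detection probability high — the throughput trade-off the paper quantifies analytically.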

    Correlation between fracture surface morphology and toughness in Zr-based bulk metallic glasses

    Fracture surfaces of Zr-based bulk metallic glasses of various compositions tested in the as-cast and annealed conditions were analyzed using scanning electron microscopy. The tougher samples showed highly jagged patterns at the initial stage of crack propagation, and the length and roughness of this jagged pattern correlate well with the measured fracture toughness values. These jagged patterns, the main source of energy dissipation in the sample, are attributed to the formation of shear bands inside the sample. This observation provides strong evidence of significant “plastic zone” screening at the crack tip.

    Popularity-based video caching techniques for cache-enabled networks: a survey

    The proliferation of the mobile Internet and connected devices, which offer a variety of services at different levels of performance, is a major challenge for the fifth generation of wireless networks and beyond. Innovative solutions are needed to leverage recent advances in machine storage/memory, context awareness, and edge computing. Cache-enabled networks and techniques such as edge caching are envisioned to reduce content delivery times and traffic congestion in wireless networks. Only a few contents are popular, accounting for the majority of viewers, so caching them reduces latency and download time. However, given the dynamic nature of user behavior, the integration of popularity prediction into caching is of paramount importance for better network utilization and user satisfaction. In this paper, we first present an overview of caching in wireless networks and then provide a detailed comparison of traditional and popularity-based caching. We discuss the attributes of videos and the evaluation criteria of caching policies. We summarize some of the recent work on proactive caching, focusing on prediction strategies. Finally, we provide insight into the potential opportunities and challenges, as well as some open research problems that must be addressed to enable the efficient deployment of popularity-based caching as part of the next-generation mobile networks.
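    As a minimal illustration of why popularity-based caching helps (a sketch under invented parameters, not any specific scheme from the surveyed literature), compare a plain FIFO cache against one that keeps the currently most-requested videos, under a Zipf-like request pattern where a few videos account for most views:

```python
# Sketch: popularity-based caching vs. FIFO under a Zipf-like workload.
import random
from collections import Counter, deque

random.seed(0)
CATALOG, CACHE_SIZE = 100, 10
# Zipf-like popularity: video i is requested with weight 1/(i+1)
weights = [1 / (i + 1) for i in range(CATALOG)]
requests = random.choices(range(CATALOG), weights=weights, k=5000)

def fifo_hit_rate(reqs):
    cache, hits = deque(maxlen=CACHE_SIZE), 0
    for v in reqs:
        if v in cache:
            hits += 1
        else:
            cache.append(v)          # evict oldest when full
    return hits / len(reqs)

def popularity_hit_rate(reqs):
    counts, cache, hits = Counter(), set(), 0
    for v in reqs:
        counts[v] += 1
        if v in cache:
            hits += 1
        # proactively keep the currently most popular videos cached
        cache = {vid for vid, _ in counts.most_common(CACHE_SIZE)}
    return hits / len(reqs)
```

    The popularity-aware policy achieves a markedly higher hit rate with the same cache size; the surveyed prediction strategies aim to estimate those popularity counts ahead of time, under dynamic user behavior.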

    Neural Bounding

    Bounding volumes are an established concept in computer graphics and vision tasks but have seen little change since their early inception. In this work, we study the use of neural networks as bounding volumes. Our key observation is that bounding, which so far has primarily been considered a problem of computational geometry, can be redefined as a problem of learning to classify space into free or occupied. This learning-based approach is particularly advantageous in high-dimensional spaces, such as animated scenes with complex queries, where neural networks are known to excel. However, unlocking neural bounding requires a twist: allowing -- but also limiting -- false positives, while ensuring that the number of false negatives is strictly zero. We enable such tight and conservative results using a dynamically-weighted asymmetric loss function. Our results show that our neural bounding produces up to an order of magnitude fewer false positives than traditional methods.
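    The conservativeness requirement can be illustrated with a simple static asymmetric loss (the paper uses a dynamically-weighted version; the weights below are invented): false negatives, i.e., occupied space classified as free, are penalized far more heavily than false positives, pushing a learned bound toward over-covering rather than missing geometry.

```python
# Sketch of an asymmetric binary cross-entropy: tolerate false positives
# (bound too loose) but heavily punish false negatives (missed geometry).
import math

def asymmetric_bce(pred, target, fn_weight=100.0, fp_weight=1.0):
    """BCE with a much larger penalty when an occupied point (target=1)
    is predicted free (pred near 0)."""
    eps = 1e-7
    pred = min(max(pred, eps), 1 - eps)
    if target == 1:   # missing occupied space breaks conservativeness
        return -fn_weight * math.log(pred)
    return -fp_weight * math.log(1 - pred)

loss_fn = asymmetric_bce(0.1, 1)  # occupied point predicted mostly free
loss_fp = asymmetric_bce(0.9, 0)  # free point predicted mostly occupied
```

    With these illustrative weights, the symmetric misclassifications incur losses differing by the weight ratio, so a trained classifier prefers false positives — the "tight but conservative" behavior the abstract describes.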

    Benefit Assessment of the Integrated Demand Management Concept for Multiple New York Metroplex Airports

    Benefits of the Integrated Demand Management (IDM) concept were assessed utilizing a newly developed automated simulation capability called Traffic Management Initiative Automated Simulation (TMIAutoSim). The IDM concept focuses on improving traffic flow management (TFM) by coordinating the FAA's strategic Traffic Flow Management System (TFMS) with its more tactical Time-Based Flow Management (TBFM) system. The IDM concept leverages a new TFMS capability called Collaborative Trajectory Options Program (CTOP) to strategically pre-condition traffic demand flowing into a TBFM-managed arrival environment, where TBFM is responsible for tactically managing traffic by generating precise arrival schedules. The IDM concept was developed over a multi-year effort, focusing on solving New York metroplex airport arrival problems. TMIAutoSim closely mimics NASA's high-fidelity simulation capabilities while enabling more data to be collected at higher speed. Using this new capability, the IDM concept was evaluated using realistic traffic across various weather scenarios. Six representative weather days were selected after clustering three months of historical data. For these six days, arrival traffic scenarios were developed for Newark Liberty International Airport (EWR) and LaGuardia Airport (LGA). For each selected day, the historical data were analyzed to accurately simulate actual operations and the weather impact of the day. Current-day operations and IDM concept operations were simulated for the same weather scenarios and the results were compared. The six selected days were categorized into two groups: clear weather for days without Ground Delay Programs (GDP), and convective weather for days with GDP and significant weather around New York metroplex airports.
    For the clear weather scenarios, IDM operations reduced last-minute, unanticipated departure delays for short-haul flights within TBFM control boundaries with minimal to no impact on throughput and total delay. For the convective weather scenarios, IDM significantly reduced delays and increased throughput to the destination airports.

    Perioperative intraperitoneal chemotherapy for peritoneal surface malignancy

    The treatment of peritoneal surface malignancy mainly focuses on diffuse malignant peritoneal mesothelioma, pseudomyxoma peritonei from appendiceal cancer, and peritoneal dissemination from gastrointestinal and ovarian cancers. Cancer progression causes peritoneal implants to be distributed throughout the abdominopelvic cavity. These nodules plus the ascitic fluid result in abdominal distension. As the disease progresses, these tumors cause intestinal obstruction, leading to debilitating symptoms and a greatly impaired quality of life. In the past, the prognosis of patients with peritoneal surface malignancy was regarded as dismal and cure was not an option. Recently, cytoreductive surgery combined with perioperative intraperitoneal chemotherapy has shown improved survival in selected patients with this disease. To date, multiple different treatment regimens of perioperative intraperitoneal chemotherapy have been used. This review focuses on the perioperative intraperitoneal chemotherapy currently in use in conjunction with cytoreductive surgery for the treatment of peritoneal surface malignancy at the Washington Cancer Institute.

    An effective disease risk indicator tool

    Each mixture of deficient molecular families of a specific disease induces the disease at a different time frame in the future. Based on this, we propose a novel methodology for personalizing a person’s level of future susceptibility to a specific disease by inferring the mixture of his/her molecular families, whose combined deficiencies are likely to induce the disease. We implemented the methodology in a working system called DRIT, which consists of the following components: logic inferencer, information extractor, risk indicator, and interrelationship-between-molecular-families modeler. The information extractor takes advantage of the exponential increase of biomedical literature to extract the common biomarkers that test positive among most patients with a specific disease. The logic inferencer transforms the hierarchical interrelationships between the molecular families of a disease into rule-based specifications. The interrelationship-between-molecular-families modeler models the hierarchical interrelationships between the molecular families whose biomarkers were extracted by the information extractor. It employs the specification rules and the inference rules for predicate logic to infer as many probable deficient molecular families as possible for a person, based on his/her few molecular families whose biomarkers tested positive in medical screening. The risk indicator outputs a risk indicator value that reflects a person’s level of future susceptibility to the disease. We evaluated DRIT by comparing it experimentally with a comparable method. Results revealed marked improvement.
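    The logic inferencer's role can be sketched as forward chaining over rule-based specifications. The following is an illustrative sketch only, not the DRIT implementation; the family names, rules, and the simple risk ratio are all invented for the example:

```python
# Sketch: forward chaining over hypothetical rules linking molecular-family
# deficiencies, starting from a few positive biomarker screenings.
rules = [  # (antecedent families, consequent family) -- invented names
    ({"familyA", "familyB"}, "familyC"),
    ({"familyC"}, "familyD"),
]

def infer_deficient(observed, rules):
    """Repeatedly apply rules until no new deficient family is derived."""
    inferred, changed = set(observed), True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if antecedents <= inferred and consequent not in inferred:
                inferred.add(consequent)
                changed = True
    return inferred

# Two families tested positive; chaining infers two more probable ones.
deficient = infer_deficient({"familyA", "familyB"}, rules)
# Toy risk indicator: fraction of the known families inferred deficient.
risk_indicator = len(deficient) / 4
```

    Here two screened deficiencies propagate through the rule hierarchy to two further probable ones, and the indicator summarizes the result as a single susceptibility score, mirroring the component pipeline the abstract describes.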